Graph Correspondence Transfer for Person Re-identification

Authors

  • Qin Zhou
  • Heng Fan
  • Shibao Zheng
  • Hang Su
  • Xinzhe Li
  • Shuang Wu
  • Haibin Ling
Abstract

In this paper, we propose a graph correspondence transfer (GCT) approach for person re-identification. Unlike existing methods, the GCT model formulates person re-identification as an off-line graph matching and on-line correspondence transferring problem. Specifically, during training, the GCT model learns off-line a set of correspondence templates from positive training pairs with various pose-pair configurations via patch-wise graph matching. During testing, for each pair of test samples, we select a few training pairs with the most similar pose-pair configurations as references, and transfer the correspondences of these references to the test pair for feature distance calculation. The matching score is derived by aggregating distances from different references. For each probe image, the gallery image with the highest matching score is the re-identification result. Compared to existing algorithms, our GCT can handle spatial misalignment caused by large variations in view angles and human poses owing to the benefits of patch-wise graph matching. Extensive experiments on five benchmarks, including VIPeR, Road, PRID450S, 3DPES and CUHK01, demonstrate the superior performance of the GCT model over other state-of-the-art methods.
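The on-line transfer step described above can be sketched as follows. This is an illustrative simplification under stated assumptions, not the authors' implementation: the pose-pair configuration is assumed to be a concatenated pose descriptor compared by Euclidean distance, patch features are plain vectors, and `templates` stands in for the correspondence templates learned off-line by patch-wise graph matching.

```python
import numpy as np

def gct_score(probe_feats, gallery_feats, probe_pose, gallery_pose,
              templates, k=2):
    """Hedged sketch of GCT on-line scoring (all names are assumptions).

    probe_feats, gallery_feats: (num_patches, feat_dim) patch features.
    probe_pose, gallery_pose: 1-D pose descriptors of the test pair.
    templates: list of (ref_pose_pair_vec, correspondence), learned
      off-line on positive training pairs; each correspondence is a list
      of (probe_patch_idx, gallery_patch_idx) pairs from graph matching.
    Returns a matching score (higher means more likely the same person).
    """
    q = np.concatenate([probe_pose, gallery_pose])
    # Select the k references with the most similar pose-pair configuration.
    order = sorted(range(len(templates)),
                   key=lambda i: np.linalg.norm(q - templates[i][0]))[:k]
    ref_dists = []
    for i in order:
        _, corr = templates[i]
        # Transfer the reference correspondences to the test pair and
        # accumulate patch-wise feature distances along the matched pairs.
        d = np.mean([np.linalg.norm(probe_feats[p] - gallery_feats[g])
                     for p, g in corr])
        ref_dists.append(d)
    # Aggregate distances over references; negate so higher is better.
    return -float(np.mean(ref_dists))
```

Ranking a probe against a gallery then amounts to calling `gct_score` for every gallery image and returning the one with the highest score.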


Related papers

Person re-identification by pose priors

The person re-identification problem is a well-known retrieval task that requires finding a person of interest in a network of cameras. In a real-world scenario, state-of-the-art algorithms are likely to fail due to serious perspective and pose changes as well as variations in lighting conditions across the camera network. The most effective approaches try to cope with all these changes by appl...


Semantics-Aware Deep Correspondence Structure Learning for Robust Person Re-Identification

In this paper, we propose an end-to-end deep correspondence structure learning (DCSL) approach to address the cross-camera person-matching problem in the person re-identification task. The proposed DCSL approach captures the intrinsic structural information on persons by learning a semantics-aware image representation based on convolutional neural networks, which adaptively learns discriminative...


Multi-Channel Pyramid Person Matching Network for Person Re-Identification

In this work, we present a Multi-Channel deep convolutional Pyramid Person Matching Network (MC-PPMN) based on the combination of the semantic-components and the color-texture distributions to address the problem of person re-identification. In particular, we learn separate deep representations for semantic-components and color-texture distributions from two person images and then employ pyramid ...


Learning Appearance Transfer for Person Re-identification

In this chapter we review methods that model the transfer a person’s appearance undergoes when passing between two cameras with non-overlapping fields of view. Whereas many recent studies deal with re-identifying a person at any new location and search for universal signatures and metrics, here we focus on solutions for the natural setup of surveillance systems in which the cameras are specific...


People Re-identification Using a Graph Kernels Approach

This paper addresses the people re-identification problem for visual surveillance applications. Our approach is based on a rich description of each occurrence of a person thanks to a graph encoding of its salient points. The appearance of persons in a video is encoded by bags of graphs whose similarities are encoded by a graph kernel. Such similarities combined with a tracking system allow us to di...




Publication date: 2017